Falconn++: A Locality-sensitive Filtering Approach for Approximate Nearest Neighbor Search

Neural Information Processing Systems

Falconn++ can filter out potentially far-away points in any hash bucket before querying, which yields higher-quality candidates than other hashing-based solutions. Theoretically, Falconn++ asymptotically achieves lower query time complexity than Falconn, an optimal locality-sensitive hashing scheme for angular distance. Empirically, Falconn++ achieves a higher recall-speed tradeoff than Falconn on many real-world data sets. Falconn++ is also competitive with HNSW, an efficient representative of graph-based solutions, in high search-recall regimes.
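The core idea of locality-sensitive filtering — hashing points into buckets by their best-aligned random projection, then keeping only the points that score highest within each bucket — can be sketched as below. This is an illustrative simplification under assumed names (`build_index`, `query`, `bucket_cap`), not Falconn++'s actual cross-polytope implementation: a single random-projection argmax stands in for the real hash, and the filtering step simply truncates each bucket to its top-scoring points.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_index(X, n_tables=4, n_projections=8, bucket_cap=10):
    """Toy LSH index with bucket filtering (illustrative, not Falconn++ itself).

    Each point hashes to the random direction it aligns with best; each
    bucket then keeps only its bucket_cap highest-scoring points, which
    filters out points likely to be far from future queries of that bucket.
    """
    tables = []
    for _ in range(n_tables):
        R = rng.standard_normal((n_projections, X.shape[1]))
        proj = R @ X.T                      # (n_projections, n_points)
        buckets = {}
        for i in range(X.shape[0]):
            h = int(np.argmax(proj[:, i]))  # best-aligned projection direction
            buckets.setdefault(h, []).append((proj[h, i], i))
        # locality-sensitive filtering: retain only top-scoring points per bucket
        for h in buckets:
            buckets[h] = sorted(buckets[h], reverse=True)[:bucket_cap]
        tables.append((R, buckets))
    return tables

def query(X, tables, q, k=3):
    """Collect filtered candidates from each table, then rerank exactly."""
    cand = set()
    for R, buckets in tables:
        h = int(np.argmax(R @ q))
        cand.update(i for _, i in buckets.get(h, []))
    # exact rerank of surviving candidates by cosine similarity to q
    ranked = sorted(cand, key=lambda i: -(X[i] @ q) / np.linalg.norm(X[i]))
    return ranked[:k]
```

Because the filtering happens at indexing time, each query scans only the small, pre-pruned buckets rather than every colliding point, which is the source of the candidate-quality gain the abstract describes.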


Supplementary material for the paper "Falconn++: A Locality-sensitive Filtering Approach for Approximate Nearest Neighbor Search"

Neural Information Processing Systems

For ScaNN, we use the latest version, 1.2.6, released on 29 April 2022. FAISS and coCEOs support multi-threading, though their thread scaling is not perfect. As shown in Table 1, HNSW takes 13.7 minutes to build its index and requires 5.4 GB of indexing space. Based on the size of HNSW's index, we tune the indexing parameters accordingly. Since the characteristics of the data sets differ, Falconn++ uses different values of iProbes for each. For ScaNN, we used the suggested parameters provided in its GitHub repository.

